On Low-Space Differentially Private Low-rank Factorization in the Spectral Norm
Author
Abstract
Low-rank factorization is used in many areas of computer science where one performs spectral analysis on large, sensitive data stored as matrices. In this paper, we study differentially private low-rank factorization of a matrix with respect to the spectral norm in the turnstile update model. In this problem, given an input matrix A ∈ R^{m×n} updated in the turnstile manner and a target rank k, the goal is to find two rank-k orthogonal matrices Uk ∈ R^{m×k} and Vk ∈ R^{n×k}, and a positive semidefinite diagonal matrix Σk ∈ R^{k×k}, such that A ≈ UkΣkVk^T with respect to the spectral norm. Our main contributions are two computationally efficient, sublinear-space algorithms for computing a differentially private low-rank factorization. We consider two levels of privacy. In the first, two matrices are neighboring if their difference has Frobenius norm at most 1. In the second, two matrices are neighboring if their difference can be represented as an outer product of two unit vectors. Both privacy levels are stronger than those studied in earlier papers such as Dwork et al. (STOC 2014), Hardt and Roth (STOC 2013), and Hardt and Price (NIPS 2014). As a corollary to our results, we obtain non-private algorithms that compute a low-rank factorization in the turnstile update model with respect to the spectral norm. We note that, prior to this work, no algorithm was known that outputs a low-rank factorization with respect to the spectral norm in the turnstile update model; that is, our algorithm gives the first non-private low-rank factorization with respect to the spectral norm in the turnstile update model. Our algorithms generate private linear sketches of the input matrix. Therefore, using the binary tree mechanism of Chan et al. (TISSEC 14(3)) and Dwork et al. (STOC 2010), we obtain algorithms for continual release of a low-rank factorization under both privacy levels.
This gives the first instance of differentially private algorithms with continual release that guarantee a stronger level of privacy than event-level privacy.
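As a non-private illustration of the objective above (not the paper's sketch-based algorithm), the best rank-k approximation of A in the spectral norm is attained by the truncated SVD, by the Eckart–Young theorem; the function name below is illustrative:

```python
import numpy as np

# Non-private illustration (not the paper's algorithm): the best rank-k
# approximation in the spectral norm is the truncated SVD (Eckart-Young).
def rank_k_factorization(A, k):
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    Uk = U[:, :k]               # m x k, orthonormal columns
    Sigma_k = np.diag(s[:k])    # k x k, PSD diagonal
    Vk = Vt[:k, :].T            # n x k, orthonormal columns
    return Uk, Sigma_k, Vk

rng = np.random.default_rng(0)
A = rng.standard_normal((50, 30))
k = 5
Uk, Sk, Vk = rank_k_factorization(A, k)
err = np.linalg.norm(A - Uk @ Sk @ Vk.T, 2)   # spectral-norm error
sigma = np.linalg.svd(A, compute_uv=False)
# Eckart-Young: this error equals the (k+1)-th singular value, sigma[k]
```

The private algorithms in the paper must achieve an error guarantee close to this baseline while observing A only through noisy linear sketches.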
Similar Resources
Fast and Space-optimal Low-rank Factorization in the Streaming Model With Application in Differential Privacy
In this paper, we consider the problem of computing a low-rank factorization of an m × n matrix in the general turnstile update model. We consider both the private and non-private setting. 1. In the non-private setting, we give a space-optimal algorithm that computes a low-rank factorization. Our algorithm maintains three sketches of the matrix instead of five as in Boutsidis et al. (STOC 2016)....
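The turnstile model referenced here can be illustrated with a minimal, hypothetical (non-private) linear sketch: because sketching is linear, applying entry updates (i, j, δ) to the sketch one at a time yields exactly the sketch of the final matrix, so the full matrix never needs to be stored. All names and dimensions below are illustrative:

```python
import numpy as np

# Hypothetical non-private sketch in the turnstile model: by linearity,
# streaming updates (i, j, delta) into S reproduces Phi @ A exactly.
rng = np.random.default_rng(1)
m, n, t = 40, 60, 10                      # illustrative dimensions
Phi = rng.standard_normal((t, m)) / np.sqrt(t)  # random sketching matrix

A = np.zeros((m, n))                      # conceptual only; not needed in practice
S = np.zeros((t, n))                      # sketch maintained in small space
for _ in range(500):
    i, j = rng.integers(m), rng.integers(n)
    delta = rng.standard_normal()
    A[i, j] += delta                      # turnstile update to entry (i, j)
    S[:, j] += delta * Phi[:, i]          # sketch update: Phi @ (delta e_i e_j^T)
# S now equals Phi @ A
```

A space-optimal algorithm of the kind described maintains a small number of such sketches rather than the matrix itself.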
Low-rank optimization with trace norm penalty
The paper addresses the problem of low-rank trace norm minimization. We propose an algorithm that alternates between fixed-rank optimization and rank-one updates. The fixed-rank optimization is characterized by an efficient factorization that makes the trace norm differentiable in the search space and the computation of duality gap numerically tractable. The search space is nonlinear but is equ...
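As a brief numerical aside on the trace norm used above (also called the nuclear norm), it is the sum of the singular values and serves as the standard convex surrogate for rank in this line of work; a minimal check:

```python
import numpy as np

# Illustrative aside: the trace (nuclear) norm is the sum of singular
# values, the convex surrogate for rank used in trace norm minimization.
rng = np.random.default_rng(2)
X = rng.standard_normal((20, 15))
s = np.linalg.svd(X, compute_uv=False)
trace_norm = s.sum()
# NumPy computes the same quantity directly via ord='nuc'
```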
A General Model for Robust Tensor Factorization with Unknown Noise
Because of the limitations of matrix factorization, such as losing spatial structure information, the concept of low-rank tensor factorization (LRTF) has been applied for the recovery of a low dimensional subspace from high dimensional visual data. The low-rank tensor recovery is generally achieved by minimizing the loss function between the observed data and the factorization representation. T...
Reexamining Low Rank Matrix Factorization for Trace Norm Regularization
Trace norm regularization is a widely used approach for learning low rank matrices. A standard optimization strategy is based on formulating the problem as one of low rank matrix factorization which, however, leads to a non-convex problem. In practice this approach works well, and it is often computationally faster than standard convex solvers such as proximal gradient methods. Nevertheless, it...
A Dual Framework for Low-rank Tensor Completion
One of the popular approaches for low-rank tensor completion is to use the latent trace norm as a low-rank regularizer. However, most of the existing works learn a sparse combination of tensors. In this work, we fill this gap by proposing a variant of the latent trace norm which helps to learn a non-sparse combination of tensors. We develop a dual framework for solving the problem of latent tra...
Journal: CoRR
Volume: abs/1611.08954
Issue: -
Pages: -
Publication date: 2016